Learn why companies need to understand AI’s promise and potential drawbacks for environmental sustainability.
More than 60% of companies are using AI to support or advance environmental sustainability. What are the use cases they're exploring, and how can businesses also guard against the environmental risks of AI?
Join Steve Odland and guest Andrew Jones, PhD, principal researcher at the Governance & Sustainability Center of The Conference Board, to learn about the environmental impact of AI-focused data centers, how companies are capitalizing on AI for sustainability, and how sustainability leaders can demonstrate the ROI of AI applications.
For more from The Conference Board:
C-Suite Perspectives is a series hosted by our President & CEO, Steve Odland. This weekly conversation takes an objective, data-driven look at a range of business topics aimed at executives. Listeners will come away with what The Conference Board does best: Trusted Insights for What’s Ahead®.
C-Suite Perspectives provides unique insights for C-Suite executives on timely topics that matter most to businesses as selected by The Conference Board. If you would like to suggest a guest for the podcast series, please email csuite.perspectives@conference-board.org. Note: As a non-profit organization under 501(c)(3) of the IRS Code, The Conference Board cannot promote or offer marketing opportunities to for-profit entities.
Steve Odland: [00:00:00] Welcome to C-Suite Perspectives, a signature series by The Conference Board. I'm Steve Odland from The Conference Board and the host of this podcast series, and in today's conversation, we're going to talk about AI and environmental sustainability. What are the risks? What are the opportunities, and what comes next?
Joining me today is Dr. Andrew Jones, principal researcher at The Conference Board's Governance & Sustainability Center. Andrew, welcome, as always.
Andrew Jones: Thanks so much for having me, Steve. A pleasure to be back once again.
Steve Odland: Yeah, so you just released this new report just a couple weeks ago on AI and the environment, and you did a great job of kind of outlining, I guess, two different lenses here. One lens dealing with the issue of how to make AI sustainable, and then the other issue of AI used for sustainability. Let's just talk through each one of these, Andrew. Let's start with sustainable AI. Give us an overview [00:01:00] of what you learned in your study.
Andrew Jones: Yeah, sure. Thanks, Steve. And it's, yeah, it's a pleasure to be here and share more on this work. Obviously, this has become such a timely issue, I think as AI has scaled, and I think more stakeholders and more people in general are paying attention to the environmental aspects of AI. And as you just hinted at, there's multiple dimensions here, right?
Yeah, in our work, in our research, in our programs, and in our recent report, we effectively break AI's environmental dimensions down into these two lenses, right? The first, as you said, is sustainable AI, managing and mitigating AI's environmental footprint. And the second, which perhaps we'll come on to discuss later, is how we can even use AI technology for positive environmental outcomes.
So I think in terms of the first lens, if you will, sustainable AI: how do we understand and then, in turn, perhaps, manage and even mitigate AI's environmental footprint? I think this is a really fascinating area. And I think we're realizing more and more that AI, and the development, the training, the running of AI models at scale, can have quite big environmental costs and environmental impacts. Particularly, I think, around [00:02:00] electricity consumption and water consumption. And I think we can break down more what these mean in practice. But I think the issue of electricity and energy is perhaps particularly interesting as we're seeing data centers proliferate and scale at ever-accelerating rates.
So we did a lot of work, Steve, breaking down these different aspects and what they actually mean in practice and what we might expect to see coming on the agenda in a few years, and how that translates into C-suite and boardroom concerns.
Steve Odland: And there are multiple aspects to AI's impact on the environment, essentially. You've got, as you mentioned, a significant energy uptake. There's also water, but then there are the emissions impacts. Let's just take them one at a time. They're building these data centers. Well, we've had data centers for a long time. The issue is on this one that these data centers, used strictly for AI, consume multiples of the amounts of electricity than data centers used for normal functions. Talk about that.
Andrew Jones: Yeah, sure. And it's worth saying, Steve, in our work, we did a [00:03:00] survey of corporate sustainability leaders. And we asked them what aspect of AI's footprint concerned them the most. There were a lot of options, and very few of them weren't concerned by something, but the top responses were pretty much data center energy demand and consumption, and the greenhouse gas emissions that can potentially result from that energy use.
So I think data centers are top of mind for a lot of people. And as you highlighted, data centers have been around for a long time, data centers really have been the backbone of cloud computing that we've seen in the last few years. But there's something different I think about the AI models. And I think we all know that AI models, particularly when it's large language models, general purpose, generative AI models, they're very energy-intensive to train and run. Training an AI model, particularly some of the more recent massive frontier models, it consumes a vast amount of energy. It really is brute force. It can involve weeks or even months of nonstop computation, specialized chips running in parallel around the clock, constantly calculating and recalculating.
And this, in turn, obviously, is very power-hungry. It burns a lot of [00:04:00] power, and that's just the training of the model. And then, the term we use is inference, when the model is actually used by people, which in turn can also have a big energy cost. A query for an AI model might be very small in terms of the energy it uses, but when you have millions or even billions of users making requests, that triggers, in turn, trillions of calculations inside a model that must happen instantly.
So AI is running on these very specialized chips, designed for speed. They're drawing much more electricity than standard CPUs. And that high performance and high demand means high electrical load. And as AI models get larger and scale up, we're just seeing exponentially higher energy use, and it's requiring, in turn, a lot more data centers. And we're seeing a lot of capital expenditure by companies on data centers right now. They're popping up around the clock, really, as the AI infrastructure gets built for the next era.
Steve Odland: Well, the big AI folks are Google and Microsoft. Some of these companies are actually contracting with energy companies to dedicate whole [00:05:00] nuclear power plants to running their data centers, speaking to what you're saying. And so, if you put in a normal search, it's just a tiny little bit of data, and also a tiny little bit of energy usage.
But when you put in an AI search, that search is scrubbing all these sources, doing all these searches at the same time, and then forming an answer. So what used to be just a tiny bit of energy used on a search has now been converted to an AI search, and I think a lot of people just do it because it gives a nice answer and you don't have to go through websites. But the multiples of energy being used, for what used to be just a little search, are huge.
Andrew Jones: And I think what you've just described is part of the transformation that is about to come, and we're starting to see it in AI: AI's transformation of business, but also just everyday life, right? As more consumers use it to find products, or more people use it to find the things that they used to find through other ways. It's the birth of [00:06:00] a new era. Behind that, there is this huge, almost insatiable, appetite for power and energy and electricity to drive it.
And yet, it's worth saying, Steve, this landscape does change quickly, right? We're constantly seeing new chips and new technologies emerge that may be more efficient, may reduce the cost, may reduce the energy costs. But even with increasing efficiency, total electricity demand still rises, because more efficiency enables larger models and more workloads, right?
So I think this trend is here to stay. And over the next few years, just in the US, we are going to see a lot more data centers, a lot of capital expenditure in this area. And if these data centers are running on fossil fuel-heavy energy grids, as they often are, that higher consumption is going to translate into greenhouse gas emissions, which, as we know, can drive warming. So there is a strong environmental aspect to all of this that I think is becoming harder and harder to ignore.
Steve Odland: Well, there's also the heat generated by the data centers, and that has to be cooled. It's a spiral of things [00:07:00] that goes beyond. And so, on one hand, they're adding a lot of capital to add the infrastructure to do that. And we have not expanded the electricity production capacity in this country for, I don't know, decades. Mostly because the incremental usage required was offset by the savings through conservation.
The move to smart appliances, LED bulbs, and all that. But that has now hit the wall, particularly with AI, and you're seeing this massive upsurge. And so there are carbon aspects to that, as you said. But then you also mentioned in the report the use of water. What does water have to do with it?
Andrew Jones: So, yeah, water is a major part of it. It's interesting, because I think the electricity side of things, the energy side of things, and the emissions that follow have probably been the main environmental issue, I think, that's captured attention when it comes to AI and data centers. But water consumption also seems to be moving up the agenda. I think we're starting to see a bit more attention on this.
As you [00:08:00] highlighted, Steve, yeah, at big data centers, big AI-centric data centers, water is used for cooling. These chips are running very hot, and circulating water through is one way to keep them at the right temperature. The water that absorbs that heat is then transferred somewhere else, and some of it evaporates. And because salty water can corrode machinery, companies and data centers are using potable water directly from municipal supplies. So it requires a lot of water, to put it bluntly. And as data centers scale, more water is required.
I do think, and we say this in the report, we do need to keep this in perspective. AI today actually represents only a very small share of total industrial water use. Quite a few studies like to point out that, for example, the amount of water used to cool AI data centers is significantly lower than what goes to irrigating golf courses or agriculture and so on. So we should keep it in perspective, but water consumption is rising as models become more complex and we build more data centers.
And it's a trickier and more complex issue, I think, because it's a more local issue than perhaps emissions and electricity, right? Data centers are often [00:09:00] concentrated in very specific states, and even within very specific regions within those states. And if they happen to be in locations that already have water stress or water challenges, these increased water demands can be a big problem. Water is managed at the municipal and community level, and often, therefore, water withdrawals for data centers can have big local impacts. And we're starting to see those emerge, I think. And we're starting to see debates emerge around this. It's a very complex issue, and I think another one that's here to stay. I think this is moving up the strategic agenda, for sure.
Steve Odland: Yeah. And putting them in municipal areas, they compete with water needs for people and homes and all of that. So the natural inclination is to say, well, just put them in rural areas. However, that makes them compete with agricultural needs, which feed the country and support national security. And so it's a challenge regardless of where you put them. But there's also the impact of the land use, and the spillover of that, too. And you go to some of these areas where these data centers pop up. I mean, these [00:10:00] are not pretty buildings. There are no windows. They are just massive concrete bunkers, and they're using acreage like you wouldn't believe. That's a big impact.
Andrew Jones: It really is. And perhaps, in our research, the land-use impacts of AI are right now less of a concern for some of our sustainability audience than the emissions or the water. But I think they're rising up the agenda, and I think you're so right, Steve, as we're seeing data centers get built out at scale. They require a lot of materials, and sometimes they can be very carbon-intensive materials. And they contribute to these local issues in all kinds of ways. The rapid buildout, particularly, as you said, in rural or semi-rural areas, has triggered a lot of zoning debates, and has triggered, in some places at least, a lot of community pushback.
So I think this is something that primarily applies to the tech firms and the data center builders, but I think it also applies to a lot of companies that are indirectly dependent on them. It is something that has to be considered, and it is a risk, I think, and a rising issue, for sure.
Steve Odland: Well, there are sound issues, too. A lot of these data centers [00:11:00] emit high-pitched noises from cooling fans and so forth. So if you live near a data center, it's not dissimilar to windmills that have bad bearings: you're constantly dealing with either this low hum or shrieking over time. So that's another whole thing with the local impact, as well. There's also, as you pointed out in the report, e-waste and hardware turnover. Talk about that.
Andrew Jones: Again, this perhaps might be one of those areas that's less recognized in the broader discussions around AI. But for sure, I think AI-centric data centers, as you said, are a little different, right? In terms of the chips they use, the way they run, the way they operate. And often they have quite an accelerated hardware refresh cycle, right? Sometimes in as little as 18 or 24 months, hardware needs to be turned over, and chips need to be refreshed. It can create big volumes of electronic waste. And not that many facilities, really, in the US or globally, are equipped to recover these sorts of materials from high-density chips and advanced memory at scale.
So I think e-waste, while right [00:12:00] now it maybe isn't a top-of-mind issue for many people when it comes to AI's environmental impact, I think it will rise, and it will become more of a sort of responsibility gap. And I think there'll be more focus on recycling: how do we extend hardware life? How do we build resale markets and clearer end-of-life accountability? I think all these issues are going to continue to rise, particularly as we see the tech become obsolete quite quickly.
Steve Odland: Yeah, all of that's a big concern. You can't pick up a newspaper without reading about rare earth minerals. And we have national security concerns with rare earth minerals. We need them for the chips themselves. We also need them for the batteries that go into all of this.
But that's another stress point on the environment because you either have to mine them here, or you're mining them somewhere on Earth, which releases carbon. And of course, all of the runoff and transportation issues with that. So lots of rare earth issues, too.
Andrew Jones: That is so right. AI hardware, right, it really does depend on quite a narrow set of critical [00:13:00] minerals. I know my colleagues at The Conference Board in the ESF Center have done some fantastic work on critical minerals, which often can be very environmentally harmful to extract, can be concentrated in certain regions, can even be quite geopolitically sensitive, right?
So I think this reliance creates a lot of potential risks. And I think unlike electricity or water, these constraints are maybe not immediately that visible. They might not be halting projects or surfacing through community debates but could represent, perhaps, a long-term environmental issue, a long-term scaling issue, a long-term resilience issue, so the critical minerals aspect to this is a really fascinating story. And also, yeah, a potentially big environmental impact, as well.
Steve Odland: Talking about the use of AI and the environmental sustainability issues around that. We're going to take a short break and be right back.
Welcome back to C-Suite Perspectives. I'm your host, Steve Odland, from The Conference Board, and I'm joined by Dr. Andrew Jones, principal researcher at The Conference Board's Governance & Sustainability Center.
So, Andrew, before the break, we were [00:14:00] talking about all of the environmental or sustainability issues around the expansion of AI. One of those you mentioned was the local areas where you see the intensity of data centers. For some reason, Virginia seems to be the epicenter of that. Also Texas, California, and to a lesser extent, Illinois. Why Virginia, of all places, or Texas, of all places, for the concentration of these data centers?
Andrew Jones: You're so right, Steve. It's so interesting. And at our latest count, we think there are about 4,000 data centers across the entire US. And about a third of these are in just the three states you mentioned: Virginia, Texas, and California. Virginia alone accounts for about 15% of all data centers in the US, which is a remarkable concentration. And a lot of them are actually within a specific area of northern Virginia, Loudoun County, which is also known as "Data Center Alley."
So I think the way we have to understand this, Steve, is that it's not coincidental, right? Data centers are often sited in strategic [00:15:00] locations where they can get good connectivity, reliable power, available land, proximity to major demand, and the right climate and environmental conditions. And Northern Virginia, I think, has emerged as a real global hub. It's the largest data center market in the world and represents a meaningful share of global capacity.
But we're also seeing concentrations elsewhere. Dallas-Fort Worth in Texas has grown rapidly. And, of course, there's Silicon Valley in California. There are reasons why this is, and obviously there are a lot of positives to it, right? In these regions, they're creating a huge amount of economic opportunity, tax revenue, new markets, and new customers.
But I think, from an environmental perspective, it can also have significant impacts when you've got so many data centers in a specific place. And the obvious one is straining local energy grids.
Steve Odland: Yeah. I've driven around Ashburn, Virginia, which is in Loudoun County, as you said. They are everywhere. And it's just really kind of eerie when you drive down a road and you see data center after data center, and you can tell the difference between data centers and any other kind of commercial building, because [00:16:00] there are no ports for trucks, as you see with distribution centers, and there's not a lot of activity. It's all buttoned up and automated and high security, and there's noise, as we said before.
Andrew Jones: Right, and as you said, Steve, not a huge number of people. They don't tend to require massive workforces; they are highly automated. It's more of a specialized set of jobs. But yeah, there are huge concentrations in these particular regions, and in practice, that means certain utility territories are facing big load demands and big reliability concerns when it comes to providing reliable electricity.
And as you know, this can have strong local and even political effects, right? It feeds into debates around energy prices and energy affordability and energy access and energy security. So, there's a lot of consequences and a lot of implications to this sort of rapid, almost even exponential, build out of data centers.
Steve Odland: Well, and we don't talk about it enough, but there are national security concerns, too, because the more we're reliant on those data centers, and the more they're sitting in one place, the bigger the target for [00:17:00] the country's enemies. And Ashburn, Virginia, is about a stone's throw from Washington, DC, and the defense suppliers are among the big users of these data centers. That also makes it a target.
We've been talking about the impact of AI on the environment, let's go the other way. Let's talk about all the positive things that AI can do for sustainability.
Andrew Jones: Yeah, happy to, Steve. As we noted at the start of this podcast, our work separates out into these two lenses, looking at sustainable AI and AI for sustainability. And we even go so far as to argue you have to look at these lenses together. You can't just focus on the environmental costs and tradeoffs of AI without also acknowledging that AI itself, the technology and the tools, can play a big role in actually addressing environmental challenges, being applied for positive environmental and sustainable outcomes.
I think AI, when you look at the underlying tech and what it really does, is well-suited to environmental challenges, right? Because many environmental problems are data-heavy, complex, dynamic, interconnected. And AI often [00:18:00] excels at detecting subtle patterns in messy data, which makes it a good fit for some of the big sustainability challenges of the day.
So I think AI has a lot of potential here. And I think corporate sustainability leaders, while it's still perhaps relatively early days, this is still a new technology, are starting to integrate AI more into their work and apply it to some of those big challenges. In fact, we found in our survey that more than 60% of corporate sustainability leaders are already using AI for something environment-related. It's worth saying that these are perhaps still early, nascent use cases, and that they're tending to apply them to perhaps the most obvious uses, like reporting, automating data analysis, writing reports, disclosure, carbon accounting.
And these are natural early applications, where you've got fragmented data sets and repetitive workflows. AI's great for bringing the efficiency gains. But we've heard from our sustainability audience, and we note this in our work, that perhaps the more advanced applications are yet to come, when it comes to more sophisticated operations [00:19:00] and climate risk analysis and all these areas. I'm happy to dive into those a little bit further.
Steve Odland: Interesting. You said 60%, I think it's 61%, of companies tell us that they're using AI for some sort of sustainability reason. But 40% are not. That's a pretty high number. I mean, I know more than half are using it, but gosh, you would think that, what is this, five years into the adoption of AI, it would be nearly a hundred percent.
Is that just because the use of AI is mostly analytical, and people have their own tools and the frameworks, and it's just inertia?
Andrew Jones: I think it's a really great observation. I think you're right that you might've expected that number to be higher, given how mainstream AI has become and the urgency of the focus across the entire private sector. I think what it reflects is not inertia or a lack of commitment on the part of our sustainability audiences, but more a case of still building up that internal literacy, that internal fluency, really [00:20:00] identifying: OK, where are the real use cases here? Where are the real returns going to be? How do we integrate AI into what are already quite complex processes and also quite fragmented and perhaps opaque data sets?
There's also a sort of laying of the groundwork for effective use of AI. But I think it is building, and if we ran that survey again at the end of this year, I'm sure that 60% would be a lot higher. I think the momentum is there, and we see that both in sustainability and in some of the related functions, as well, like social impact and philanthropy. So I think it shows that the adoption of AI does require some work, right? It's not always easy. It's not always straightforward. There is some connective tissue and fabric that needs building, but it is happening, for sure. And I think it will continue to mature as a discipline within sustainability.
Steve Odland: So the biggest use that's in these reporting and analytical functions, you've said that, but there's also use in operations. Talk about that, for energy efficiency, waste and water management, all of that.[00:21:00]
Andrew Jones: A hundred percent. We think, and we hear from our members, that the potentially highest-value sustainability and environmental use cases are here. They sit beyond reporting and disclosure. They sit in how we can really transform operations and logistics and supply chains, in, for example, energy efficiency and grid optimization.
We know that targeted and effective use of AI, through analysis of data and looking for efficiencies, can really cut energy use, whether it's in buildings, industrial sites, or even within data centers. And we've heard stories of AI systems actually reducing the power usage and cooling usage within data centers as they're applied to those data sets. And in the same vein, AI can also be used to forecast energy use and improve the mix and improve efficiency. So there's so much potential there to be integrated much deeper into corporate operations.
And obviously that goes beyond just the sustainability team. The sustainability team has a key role to play there, but it's really the entire business, the business units and [00:22:00] operations, supply chains, logistics, et cetera, that all have an important role to play in bringing in AI in an effective and measurable way.
And you could add a lot more examples to that, I think, of what AI can do for optimizing logistics, routes, network operations, supply chains, materials innovation, recycling. There's so much here that AI can be applied to. And it's never easy. There are always bottlenecks. There are always challenges. But I think in the years to come, we'll start to see companies realize some quite measurable and tangible environmental gains from the application of AI.
Steve Odland: In our latest C-Suite Outlook survey, one of the key issues for CEOs and CFOs in the application of AI was demonstrating some sort of ROI. It's gone through a typical kind of startup curve: a few years ago, nobody could spell AI, and then it was rapid adoption. And then last year in the survey, CEOs were most concerned about falling behind, and so it was like, let's invest like crazy so that we don't fall behind.
This year they're looking for [00:23:00] ROI. And that showed up in your survey here as it relates to the use of AI for sustainability tools, because there were not clear returns on the deployment of those tools.
Andrew Jones: That's right. And you're so right that, across the board, the issue of ROI is emerging as this central AI question, isn't it? We've all adopted AI to some extent. How do we now integrate it into our core processes, our core business, in a way that can also be measured and show tangible returns and justify that investment? I think it's the same thing for the corporate sustainability function.
It's interesting cause there are a lot of examples out there of very tangible and measured improvements in sustainability through the use of AI that also translate into business returns. I'm thinking, for example, when companies have applied AI to their logistics and have optimized the routes of their fleets and reduced mileage, reduced fuel use, reduced emissions. That has a big return to the business, as well, in terms of costs and saved [00:24:00] costs and more efficient ways of operating.
So there's a strong ROI element to this, too, right? I think if AI can be leveraged by companies to advance sustainable and environmental goals, it's also going to contribute towards showing that broader business case, right? As we see efficiencies improve and consumption of fuel and materials reduce, I think that will play an important part in that. I think sustainability leaders, there's an opportunity here for them to be at the table and be part of these discussions and really make their voice heard.
Steve Odland: Any final thoughts or insights you'd like to share from your report?
Andrew Jones: As always, Steve, we've had a wide-ranging and a comprehensive discussion.
I'd just add that we've also done a lot of work in The Conference Board's Governance & Sustainability Center on AI oversight and AI governance and responsible AI. And one of the findings in our work, which perhaps isn't surprising given some of the things we spoke about, is that environmental issues have often ranked much lower in those discussions than perhaps some other big risk areas of AI, right? Like bias and transparency and privacy and security and [00:25:00] safety. That makes a lot of sense, given those are the areas where regulators have perhaps focused.
But I think as AI adoption accelerates, and some of the things we've been speaking about today become more apparent, I think we'll see perhaps environmental issues move up that agenda, as well. So I think this isn't just a sustainability story. This is a governance and oversight story, as well. So definitely a space we'll be watching and monitoring closely at The Conference Board.
Steve Odland: All right, we'll be watching it, too. And Andrew, they can find your report where?
Andrew Jones: I'd encourage anyone listening to visit the TCB website, tcb.org, and visit the Governance & Sustainability Center. You'll find our report there, publicly available.
Steve Odland: It's worth the read, and lots of good data in there. Dr. Andrew Jones, thanks for being with us today and sharing the insights on your report.
Andrew Jones: Thanks so much for having me, Steve. Always, always a pleasure to join you.
Steve Odland: And thanks to all of you for listening to C-Suite Perspectives. I'm Steve Odland, and this series has been brought to you by The Conference Board.
C-Suite Perspectives / 17 Feb 2026